On Entropy and Lyapunov Exponents for Finite-State Channels

Authors

  • Tim Holliday
  • Peter Glynn
  • Andrea Goldsmith
Abstract

The Finite-State Markov Channel (FSMC) is a time-varying channel having states that are characterized by a finite-state Markov chain. These channels have infinite memory, which complicates their capacity analysis. We develop a new method to characterize the capacity of these channels based on Lyapunov exponents. Specifically, we show that the input, output, and conditional entropies for this channel are equivalent to the largest Lyapunov exponents for a particular class of random matrix products. We then show that the Lyapunov exponents can be expressed as expectations with respect to the stationary distributions of a class of continuous-state space Markov chains. The stationary distributions for this class of Markov chains are shown to be unique and continuous functions of the input symbol probabilities, provided that the input sequence has finite memory. These properties allow us to express mutual information and channel capacity in terms of Lyapunov exponents. We then leverage this connection between entropy and Lyapunov exponents to develop a rigorous theory for computing or approximating entropy and mutual information for finite-state channels with dependent inputs. We develop a method for directly computing entropy of finite-state channels that does not rely on simulation and establish its convergence. We also obtain a new asymptotically tight lower bound for entropy based on norms of random matrix products. In addition, we prove a new functional central limit theorem for sample entropy and apply this theorem to characterize the error in simulated estimates of entropy. Finally, we present numerical examples of mutual information computation for ISI channels and observe the capacity benefits of adding memory to the input sequence for such channels.
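The central quantity in the abstract is the largest Lyapunov exponent of a random matrix product, λ = lim (1/n) log‖Mₙ⋯M₁‖. The paper's contribution is a method that computes such exponents *without* simulation; purely as an illustration of the object itself, the sketch below estimates a top Lyapunov exponent by simulation with renormalization. The two matrices A and B are generic positive matrices chosen for the example, not the channel-specific matrices from the paper.

```python
import numpy as np

def top_lyapunov_exponent(sample_matrix, n_steps=10_000, dim=2, seed=0):
    """Monte Carlo estimate of the largest Lyapunov exponent of a
    random matrix product: lambda = lim (1/n) log ||M_n ... M_1 v||."""
    rng = np.random.default_rng(seed)
    v = np.ones(dim) / dim          # arbitrary positive starting vector
    log_growth = 0.0
    for _ in range(n_steps):
        v = sample_matrix(rng) @ v
        norm = np.linalg.norm(v, 1)
        log_growth += np.log(norm)  # accumulate the per-step log growth
        v /= norm                   # renormalize to avoid under/overflow
    return log_growth / n_steps

# Two fixed positive matrices, drawn i.i.d. with equal probability.
A = np.array([[0.6, 0.3], [0.1, 0.5]])
B = np.array([[0.2, 0.4], [0.7, 0.1]])

def sample(rng):
    return A if rng.random() < 0.5 else B

lam = top_lyapunov_exponent(sample)
```

Since every column sum of A and B lies strictly between 0.5 and 0.9, the estimate must land in (log 0.5, log 0.9), i.e. it is negative; convergence of this naive estimator is slow, which is one motivation for the direct computation methods the paper develops.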


Related papers

Capacity of Finite State Markov Channels with General Inputs

We study new formulae based on Lyapunov exponents for entropy, mutual information, and capacity of finite state discrete time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. Our methods allow for arbitrary input processes and channel dynamics, provided both have finite memory. We show that the entropy ra...


Entropy and Mutual Information for Markov Channels with General Inputs

We study new formulas based on Lyapunov exponents for entropy, mutual information, and capacity of finite state discrete time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. Our methods allow for arbitrary input processes and channel dynamics, provided both have finite memory. We show that the entropy ra...


Always finite entropy and Lyapunov exponents of two-dimensional cellular automata

Given a new definition for the entropy of a cellular automaton acting on a two-dimensional space, we propose an inequality between the entropy of the shift on a two-dimensional lattice and some angular analog of Lyapunov exponents.


Metastability, Lyapunov Exponents, Escape Rates, and Topological Entropy in Random Dynamical Systems

We explore the concept of metastability in random dynamical systems, focusing on connections between random Perron–Frobenius operator cocycles and escape rates of random maps, and on topological entropy of random shifts of finite type. The Lyapunov spectrum of the random Perron–Frobenius cocycle and the random adjacency matrix cocycle is used to decompose the random system into two disjoint ran...


Compressor performance, absolutely! - Data Compression Conference, 2002. Proceedings. DCC 2002

Kolmogorov (1958) recognised that by appropriately encoding phase-space trajectories of dynamical systems, one could in principle compute the corresponding Shannon entropies (in physics, referred to as the Kolmogorov-Sinai entropy) for each. Pesin [1] (1977) proved that for certain classes of dissipative non-linear systems, including one-dimensional maps, the KS-entropy is precisely the sum of ...




Publication date: 2003